A red team is an independent group that challenges an organization to improve its effectiveness. The United States intelligence community (military and civilian) has red teams that explore alternative futures and write articles as if they were foreign world leaders. Little formal doctrine or published material on red teaming exists in the military. Private businesses, especially those heavily invested as government or defense contractors such as IBM and SAIC, and U.S. government agencies such as the CIA, have long used red teams. Red teams in the United States armed forces were used much more frequently after a 2003 Defense Science Review Board recommended them to help prevent the shortcomings that led up to the attacks of September 11, 2001. The U.S. Army then stood up a service-level red team, the Army Directed Studies Office, in 2004. This was the first service-level red team and, until 2011, the largest in the Department of Defense.

Penetration testers assess an organization's security, often unbeknownst to client staff. This type of red team provides a more realistic picture of security readiness than exercises, role playing, or announced assessments, and it may trigger active controls and countermeasures within a given operational environment.

In wargaming, the opposing force (or OPFOR) in a simulated military conflict may be referred to as a red cell (a very narrow form of red teaming) and may also engage in red team activity. The key theme is that the aggressor is composed of various threat actors, equipment, and techniques that are at least partially unknown to the defenders. The red cell challenges operations planning by playing the role of a thinking enemy. In United States war-gaming simulations, the U.S. force is always the blue team and the opposing force is always the red team. When applied to intelligence work, red teaming is sometimes called ''alternative analysis''.
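As a concrete illustration of the kind of automated probing a red team might script during reconnaissance, the sketch below performs a minimal TCP connect scan: it checks which of a few common ports on a target host accept connections. This is a generic, hypothetical example, not any particular team's tooling; the host and port list are placeholders, and real engagements are authorized in writing before any scanning takes place.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Placeholder target and ports for illustration only
    print(scan_ports("127.0.0.1", [22, 80, 443]))
```

A connect scan like this is the noisiest possible probe: it completes the full TCP handshake, so it is exactly the kind of activity that triggers the defenders' active controls mentioned above, which is sometimes the point of an announced exercise.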
When used in a hacking context, a red team is a group of white-hat hackers that attacks an organization's digital infrastructure as a real attacker would, in order to test the organization's defenses (often known as "penetration testing"). Companies including Microsoft perform regular exercises in which both red and blue teams take part. Benefits include challenging preconceived notions and clarifying the problem state that planners are attempting to mitigate, along with a more accurate understanding of how sensitive information is externalized and of exploitable patterns and instances of bias.

== History ==

Billy Mitchell, a passionate early advocate of air power, demonstrated the obsolescence of battleships in bombing tests against the captured World War I German battleship ''Ostfriesland'' and the U.S. pre-dreadnought battleship ''Alabama''. Rear Admiral Harry E. Yarnell demonstrated in 1932 the effectiveness of an attack on Pearl Harbor, showing almost exactly how Japanese tactics would destroy the fleet in harbor nine years later. Although the umpires ruled the exercise a total success, the umpires' report on the overall exercises makes no mention of the stunning effectiveness of the simulated attack. Their conclusion to what became known as Fleet Problem XIII was quite the opposite:

:: ''It is doubtful if air attacks can be launched against Oahu in the face of strong defensive aviation without subjecting the attacking carriers to the danger of material damage and consequent great losses in the attack air force.''